A Computationally Efficient P-LRU based Optimal Cache Heap Object Replacement Policy
Authors
Abstract
Recent advances in distributed computing highlight the need for highly associative yet inexpensive cache memories in state-of-the-art processors (e.g., Intel Core i5, i7). Accordingly, numerous prior studies have introduced cache replacement policies, which are among the key factors determining the effectiveness of a cache memory. Most conventional cache replacement algorithms, however, are not efficient in terms of memory management and computational complexity. A thorough analysis is therefore required to suggest a new, optimal solution to the state-of-the-art cache replacement problem. The proposed study conceptualizes a theoretical model for optimal cache heap object replacement. The model incorporates tree-based and MRU-bit (Most Recently Used) pseudo-LRU (Least Recently Used) mechanisms and configures them with the JVM's garbage collector to evict old referenced objects from the heap cache lines. The performance analysis of the proposed system shows that it outperforms conventional state-of-the-art replacement policies at much lower cost and complexity, and that the hit rate on the cache heap is considerably higher than with conventional techniques.

Keywords: cache heap object replacement; garbage collectors; Java Virtual Machine; pseudo-LRU
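The abstract describes a tree-based pseudo-LRU mechanism for choosing which heap cache line to reclaim, but no implementation is given. As a minimal sketch, assuming a power-of-two associativity managed per set, tree pseudo-LRU victim selection can be written in Java as follows; the class name TreePLRU and its interface are illustrative, not the authors' code.

    /** Minimal tree-based pseudo-LRU state for one N-way cache set (N must be a power of two). */
    final class TreePLRU {
        private final int ways;          // associativity of the set
        private final boolean[] bits;    // ways - 1 internal tree bits; true = victim lies in the right subtree

        TreePLRU(int ways) {
            if (Integer.bitCount(ways) != 1) throw new IllegalArgumentException("ways must be a power of two");
            this.ways = ways;
            this.bits = new boolean[ways - 1];
        }

        /** Record a hit or fill on the given way: point the bits on its path at the other subtree. */
        void touch(int way) {
            int node = 0, lo = 0, hi = ways;            // current subtree covers ways [lo, hi)
            while (hi - lo > 1) {
                int mid = (lo + hi) / 2;
                boolean inRight = way >= mid;
                bits[node] = !inRight;                   // victim pointer moves away from the accessed way
                node = 2 * node + (inRight ? 2 : 1);
                if (inRight) lo = mid; else hi = mid;
            }
        }

        /** Follow the victim pointers from the root to pick the pseudo-LRU way to evict. */
        int victim() {
            int node = 0, lo = 0, hi = ways;
            while (hi - lo > 1) {
                int mid = (lo + hi) / 2;
                boolean goRight = bits[node];
                node = 2 * node + (goRight ? 2 : 1);
                if (goRight) lo = mid; else hi = mid;
            }
            return lo;
        }
    }

On a miss, the heap-cache manager (in the paper's setting, presumably a hook driven by the garbage collector) would call victim() to choose the way to reclaim and touch() the way that receives the new object; only ways - 1 bits of state are needed per set, which is why pseudo-LRU remains cheap at high associativity.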
Similar resources
Reduction in Cache Memory Power Consumption based on Replacement Quantity
Today, power consumption is considered one of the most important design issues, so reducing it plays a considerable role in system development. Previous studies have shown that approximately 50% of total power consumption is spent in cache memories. There is a direct relationship between power consumption and the number of replacements made in the cache: the fewer the replacements, the less...
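The truncated abstract's core claim is that power tracks the number of replacements. A hypothetical way to make that measurable is to instrument a cache model with an eviction counter and compare policies by replacement quantity; the CountingLruCache class below is only an illustration of such instrumentation, not the cited paper's method.

    import java.util.LinkedHashMap;
    import java.util.Map;

    /** LRU map that counts evictions, so two configurations can be compared by
     *  replacement quantity (used here as a rough proxy for dynamic cache power). */
    final class CountingLruCache<K, V> extends LinkedHashMap<K, V> {
        private final int capacity;
        private long evictions = 0;

        CountingLruCache(int capacity) {
            super(16, 0.75f, true);      // access-order iteration gives LRU behaviour
            this.capacity = capacity;
        }

        @Override
        protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
            boolean evict = size() > capacity;
            if (evict) evictions++;      // each replacement implies a write-back / refill
            return evict;
        }

        long evictions() { return evictions; }
    }

Replaying the same access trace through two such caches and comparing evictions() gives a first-order proxy for the replacement-related power the abstract refers to.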
Performance Improvement of Least-Recently-Used Policy in Web Proxy Cache Replacement Using Supervised Machine Learning
Web proxy caching is one of the most successful solutions for improving the performance of Web-based systems. In Web proxy caching, the Least-Recently-Used (LRU) policy is the most common replacement policy and is widely used in Web proxy cache management. However, LRU is not efficient enough and may suffer from cache pollution by unwanted Web objects. Therefore, in this paper, LR...
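The cited abstract stops short of the details, but the general idea, using a supervised classifier to keep LRU from being polluted by one-shot Web objects, can be sketched as follows. The class MlAssistedLruCache, the Predicate standing in for the trained model, and the eviction rule are assumptions for illustration only.

    import java.util.ArrayDeque;
    import java.util.Deque;
    import java.util.HashSet;
    import java.util.Iterator;
    import java.util.Set;
    import java.util.function.Predicate;

    /** Sketch of ML-assisted LRU for a proxy cache: a trained classifier (here just a
     *  Predicate over the object's URL) flags objects unlikely to be re-referenced,
     *  and those are evicted before the plain LRU victim. */
    final class MlAssistedLruCache {
        private final int capacity;
        private final Deque<String> lruOrder = new ArrayDeque<>();  // head = least recently used
        private final Set<String> cached = new HashSet<>();
        private final Predicate<String> likelyReused;               // stands in for the trained model

        MlAssistedLruCache(int capacity, Predicate<String> likelyReused) {
            this.capacity = capacity;
            this.likelyReused = likelyReused;
        }

        void access(String url) {
            if (cached.contains(url)) {            // hit: move to the MRU position
                lruOrder.remove(url);
                lruOrder.addLast(url);
                return;
            }
            if (cached.size() >= capacity) evictOne();
            cached.add(url);
            lruOrder.addLast(url);
        }

        private void evictOne() {
            // Prefer the least recently used object that the model predicts will NOT be reused.
            for (Iterator<String> it = lruOrder.iterator(); it.hasNext(); ) {
                String candidate = it.next();
                if (!likelyReused.test(candidate)) {
                    it.remove();
                    cached.remove(candidate);
                    return;
                }
            }
            cached.remove(lruOrder.pollFirst());   // fall back to plain LRU
        }
    }

For example, new MlAssistedLruCache(1000, url -> !url.endsWith(".tmp")) uses a trivial predicate where the cited work would plug in a model trained on past request logs.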
Highly Efficient LRU Implementations for High Associativity Cache Memory
High associativity with an LRU replacement policy is an optimal solution for cache design when the miss rate has to be reduced, but as associativity increases, implementing the LRU policy becomes complex. Many advanced and demanding technologies such as multimedia, multithreading, databases and low-power devices running on high-performance processors in servers and workstations use higher associativit...
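The complexity the abstract alludes to is easy to see from the per-set state each policy needs. The small program below prints the usual textbook estimates (age counters and a recency matrix for true LRU versus one bit per internal tree node for pseudo-LRU); the figures only illustrate the trend and are not taken from the cited paper.

    /** Per-set replacement-state cost (in bits) for true LRU vs. tree pseudo-LRU. */
    public final class ReplacementStateCost {
        public static void main(String[] args) {
            System.out.println("ways  LRU-age-counters  LRU-matrix  tree-PLRU");
            for (int ways = 2; ways <= 64; ways *= 2) {
                int log2 = Integer.numberOfTrailingZeros(ways); // exact, since ways is a power of two
                int ageCounters = ways * log2;                  // one log2(N)-bit counter per way
                int matrix      = ways * (ways - 1) / 2;        // upper-triangular recency matrix
                int treePlru    = ways - 1;                     // one bit per internal tree node
                System.out.printf("%4d  %16d  %10d  %9d%n", ways, ageCounters, matrix, treePlru);
            }
        }
    }

At 16 ways, for instance, true LRU needs 64 bits of age counters or 120 matrix bits per set, while tree pseudo-LRU needs only 15 bits, which is why high-associativity designs favour pseudo-LRU.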
Least recently plus five least frequently replacement policy (LR+5LF)
In this paper, we present a new block replacement policy in which we propose a new, efficient algorithm that combines two important policies, Least Recently Used (LRU) and Least Frequently Used (LFU). The implementation of the proposed policy is simple, and it requires only limited calculations to determine the victim block. We propose our own models to implement the LRU and LFU policies. The new policy gives e...
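The truncated abstract does not spell out the calculation, so the sketch below only illustrates one plausible reading of combining the two policies: consider the five least frequently used blocks and evict the least recently used among them. The class name, data structures and tie-breaking are assumptions, not the published LR+5LF algorithm.

    import java.util.Comparator;
    import java.util.HashMap;
    import java.util.List;
    import java.util.Map;

    /** Illustrative recency + frequency victim choice in the spirit of LR+5LF. */
    final class RecencyFrequencyVictim {
        private final Map<Integer, Long> lastUse = new HashMap<>();   // block id -> logical access time
        private final Map<Integer, Long> useCount = new HashMap<>();  // block id -> access frequency
        private long clock = 0;

        void recordAccess(int blockId) {
            lastUse.put(blockId, ++clock);
            useCount.merge(blockId, 1L, Long::sum);
        }

        /** Pick a victim: least recently used among the five least frequently used blocks. */
        int chooseVictim() {
            List<Integer> fiveLeastFrequent = useCount.entrySet().stream()
                    .sorted(Map.Entry.comparingByValue())             // ascending frequency
                    .limit(5)
                    .map(Map.Entry::getKey)
                    .toList();
            return fiveLeastFrequent.stream()
                    .min(Comparator.comparing(lastUse::get))          // oldest access among those five
                    .orElseThrow();
        }

        void evict(int blockId) {
            lastUse.remove(blockId);
            useCount.remove(blockId);
        }
    }

The design intent such a combination captures is that pure LFU keeps stale but once-popular blocks forever, while pure LRU discards frequently reused blocks after a burst of unrelated accesses; restricting the LRU choice to a small low-frequency candidate set mitigates both effects with little extra calculation.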
Journal title:
Volume, Issue:
Pages: -
Publication date: 2017